Critical initialisation for deep signal propagation in noisy rectifier neural networks

Neural Information Processing Systems

Stochastic regularisation is an important weapon in the arsenal of a deep learning practitioner. However, despite recent theoretical advances, our understanding of how noise influences signal propagation in deep neural networks remains limited. By extending recent work based on mean field theory, we develop a new framework for signal propagation in stochastically regularised neural networks. Our "noisy signal propagation" theory can incorporate several common noise distributions, including additive and multiplicative Gaussian noise as well as dropout. We use this framework to investigate initialisation strategies for noisy ReLU networks. We show that no critical initialisation strategy exists for additive noise, with signal propagation exploding regardless of the selected noise distribution. For multiplicative noise (e.g. dropout), we identify alternative critical initialisation strategies that depend on the second moment of the noise distribution. Simulations and experiments on real-world data confirm that our proposed initialisation stably propagates signals in deep networks, whereas an initialisation that disregards the noise fails to do so.
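The claimed behaviour is easy to check numerically. The sketch below (our own illustration, not the authors' code) propagates a random input through a deep ReLU network with inverted dropout and records the pre-activation variance per layer. Following the abstract's description, the noise-adjusted critical weight variance is taken to be the standard "He" value 2 divided by the second moment E[ε²] of the multiplicative noise; for inverted dropout with keep probability p, E[ε²] = 1/p.

```python
import numpy as np

def layer_variances(sigma_w_sq, keep_prob, width=1000, depth=50, seed=0):
    """Propagate a random input through a deep ReLU network with
    inverted-dropout noise; return the pre-activation variance per layer."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(width)
    variances = []
    for _ in range(depth):
        # weights drawn with per-layer variance sigma_w_sq / fan_in
        W = rng.standard_normal((width, width)) * np.sqrt(sigma_w_sq / width)
        h = W @ x
        variances.append(h.var())
        x = np.maximum(h, 0.0)
        # multiplicative dropout noise: eps in {0, 1/p}, so E[eps^2] = 1/p
        x = x * rng.binomial(1, keep_prob, size=width) / keep_prob
    return variances

p = 0.8                                   # dropout keep probability
mu2 = 1.0 / p                             # second moment of the noise
v_he = layer_variances(2.0, p)            # "He" init, ignoring the noise
v_crit = layer_variances(2.0 / mu2, p)    # noise-adjusted critical init
```

With the uncorrected He initialisation each layer multiplies the variance by roughly σ_w² E[ε²] / 2 = 1/p > 1, so `v_he` grows geometrically with depth, while `v_crit` remains of the same order as its initial value.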


Reviews: Critical initialisation for deep signal propagation in noisy rectifier neural networks

Neural Information Processing Systems

This paper analyzes signal propagation in vanilla fully-connected neural networks in the presence of noise. For ReLU networks, it concludes that the initial weight variance should be adjusted from the "He" initialization to account for the noise scale. Various empirical simulations corroborate this claim. Generally speaking, I believe studying signal propagation in random neural networks is a powerful way to build better initialization schemes, and examining signal propagation in the presence of noise is an interesting direction. The paper is well-written and easy to read.


Critical initialisation for deep signal propagation in noisy rectifier neural networks

Pretorius, Arnu; van Biljon, Elan; Kroon, Steve; Kamper, Herman

Neural Information Processing Systems